Bounds on the information divergence for hypergeometric distributions

Authors

Abstract

The hypergeometric distributions have many important applications, but they have not received sufficient attention in information theory. Hypergeometric distributions can be approximated by binomial distributions or Poisson distributions. In this paper we present upper and lower bounds on the information divergence. These bounds are important for statistical testing and for a better understanding of the notion of exchangeability.
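As a concrete illustration of the quantity being bounded, here is a minimal numeric sketch (not the paper's bounds; the helper name and parameter values are mine, chosen for illustration) that computes the information divergence between a hypergeometric distribution and its binomial approximation with the same sampling probability:

```python
# A minimal numeric sketch (not the paper's bounds): the information
# divergence D(H || B) = sum_k h(k) log(h(k) / b(k)) in nats, between a
# hypergeometric H(N, K, n) and its binomial approximation B(n, K/N).
import numpy as np
from scipy.stats import hypergeom, binom

def divergence_hyp_binom(N, K, n):
    # Support of the hypergeometric distribution.
    k = np.arange(max(0, n - (N - K)), min(n, K) + 1)
    h = hypergeom.pmf(k, N, K, n)  # scipy order: (k, population, successes, draws)
    b = binom.pmf(k, n, K / N)
    mask = h > 0
    return float(np.sum(h[mask] * np.log(h[mask] / b[mask])))

# Fixed number of draws n and success fraction K/N, growing population N:
# the divergence shrinks, matching the binomial approximation heuristic.
for N in (50, 500, 5000):
    print(N, divergence_hyp_binom(N=N, K=N // 2, n=10))
```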


Similar articles

Lower bounds on Information Divergence

In this paper we establish lower bounds on information divergence from a distribution to certain important classes of distributions, such as Gaussian, exponential, Gamma, Poisson, geometric, and binomial. These lower bounds are tight, and for several convergence theorems where a rate of convergence can be computed, this rate is determined by the lower bounds proved in this paper. General techniques fo...
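To make the notion of divergence from a distribution to a class of distributions concrete, the sketch below (an illustration relying on the standard exponential-family moment-matching fact, not code from the cited paper; parameters are arbitrary examples) grid-searches the divergence from a hypergeometric distribution to the Poisson family and confirms the minimizer matches the mean:

```python
# Illustrative sketch: the divergence from P to the Poisson family is
# inf_lambda D(P || Po(lambda)); for an exponential family the infimum
# is attained by matching the mean, which the grid search confirms.
import numpy as np
from scipy.stats import poisson, hypergeom

def kl_to_poisson(p, support, lam):
    q = poisson.pmf(support, lam)
    mask = p > 0
    return float(np.sum(p[mask] * np.log(p[mask] / q[mask])))

N, K, n = 100, 10, 30                 # example hypergeometric parameters
support = np.arange(0, n + 1)
p = hypergeom.pmf(support, N, K, n)
mean = float(np.sum(support * p))     # here K * n / N = 3

lams = np.linspace(0.5 * mean, 1.5 * mean, 201)
best = min(lams, key=lambda lam: kl_to_poisson(p, support, lam))
print(mean, best)                     # the minimizing lambda sits at the mean
```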


Exponential bounds for the hypergeometric distribution.

We establish exponential bounds for the hypergeometric distribution which include a finite sampling correction factor, but are otherwise analogous to bounds for the binomial distribution due to León and Perron (Statist. Probab. Lett. 62 (2003) 345-354) and Talagrand (Ann. Probab. 22 (1994) 28-76). We also extend a convex ordering of Kemperman's (Nederl. Akad. Wetensch. Proc. Ser. A 76 = Indag. Mat...
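The refined León and Perron bounds are not reproduced here; as a hedged orientation, the sketch below compares the exact hypergeometric upper tail with the classical bound exp(-2nt^2), which Hoeffding (1963) showed remains valid for sampling without replacement (parameter values are arbitrary examples):

```python
# Hedged numeric check: exact hypergeometric upper tail versus the
# classical Hoeffding bound P(X/n - K/N >= t) <= exp(-2 n t^2), which
# also covers sampling without replacement (Hoeffding 1963).
import numpy as np
from scipy.stats import hypergeom

N, K, n = 1000, 400, 50        # population, successes, draws (examples)
p = K / N
for t in (0.05, 0.10, 0.15):
    k = int(np.ceil(n * (p + t)))
    exact = hypergeom.sf(k - 1, N, K, n)   # P(X >= k)
    bound = np.exp(-2 * n * t**2)
    print(t, exact, bound, exact <= bound)  # bound holds in each case
```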


On bounds involving k-Appell’s hypergeometric functions

In this paper, we derive a new extension of Hermite-Hadamard's inequality via k-Riemann-Liouville fractional integrals. Two new k-fractional integral identities are also derived. Then, using these identities as an auxiliary result, we obtain some new k-fractional bounds which involve k-Appell's hypergeometric functions. These bounds can be viewed as new k-fractional estimations of trapezoidal a...
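For orientation only (the k-fractional machinery of the cited paper is not reproduced), here is a numeric check of the classical Hermite-Hadamard inequality that such results extend; the convex function and interval are arbitrary examples:

```python
# Classical Hermite-Hadamard inequality for a convex f on [a, b]:
#   f((a+b)/2) <= (1/(b-a)) * integral_a^b f(x) dx <= (f(a)+f(b))/2
import numpy as np
from scipy.integrate import quad

f = np.exp                     # a convex function (example)
a, b = 0.0, 2.0
midpoint = f((a + b) / 2)
average, _ = quad(f, a, b)
average /= (b - a)
trapezoid = (f(a) + f(b)) / 2
print(midpoint <= average <= trapezoid)  # True for convex f
```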


On Classification of Bivariate Distributions Based on Mutual Information

Among all measures of independence between random variables, mutual information is the only one that is based on information theory. Mutual information takes into account all kinds of dependencies between variables, i.e., both linear and non-linear dependencies. In this paper we have classified some well-known bivariate distributions into two classes of distributions based on their mutua...
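As a concrete example of mutual information for a bivariate distribution (an illustrative sketch, not the cited paper's classification), a bivariate Gaussian with correlation rho has the closed form I(X;Y) = -(1/2) ln(1 - rho^2), which the grid integration below verifies numerically:

```python
# Mutual information of a bivariate Gaussian: numerical grid integration
# of the joint density against the product of marginals, compared with
# the closed form I(X;Y) = -0.5 * ln(1 - rho^2).
import numpy as np
from scipy.stats import multivariate_normal, norm

rho = 0.6                                  # example correlation
cov = [[1.0, rho], [rho, 1.0]]
xs = np.linspace(-6, 6, 601)
X, Y = np.meshgrid(xs, xs)
joint = multivariate_normal([0, 0], cov).pdf(np.dstack((X, Y)))
marg = norm.pdf(xs)
product = np.outer(marg, marg)             # product of the two marginals
dx = xs[1] - xs[0]
mi_numeric = np.sum(joint * np.log(joint / product)) * dx * dx
mi_closed = -0.5 * np.log(1 - rho**2)
print(mi_numeric, mi_closed)               # both approximately 0.223
```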



Journal

Journal title: Kybernetika

Year: 2021

ISSN: 1805-949X, 0023-5954

DOI: https://doi.org/10.14736/kyb-2020-6-1111